Patent abstract:
An apparatus for determining the movement of a target, comprising at least one light sensor (12) and a light modification structure for relaying light reflected from the target to the at least one light sensor (12), the light modification structure having a plurality of layers configured in a stepped structure, and which is configured for selectively blocking a portion of the light.
Publication number: AT15499U1
Application number: TGM50195/2015U
Filing date: 2012-05-03
Publication date: 2017-10-15
Inventor:
Applicant: Maxim Integrated Products;
IPC main class:
Patent description:

Description: [0001] The invention relates to a device for determining the movement of a target, comprising at least one light sensor and a light modification structure for relaying light reflected from the target to the at least one light sensor.
Such a device may be provided in particular for determining a physical gesture.
A gesture sensor is a human interface device that enables the detection of physical movement without the user actually touching the device. The detected movements can serve as input commands for the device. The device may be programmed to contactlessly detect various hand movements, such as movements from left to right, from right to left, from top to bottom, from bottom to top, from inside to outside, and/or from outside to inside. Gesture sensors have become very popular in hand-held devices such as tablet computers, laptops, smartphones and other portable devices. Gesture sensors are also implemented in video game consoles to capture the motion of a player in the video game.
Proximity detectors are known from US 2008/0006762 A1, US 2010/0102230 A1 and US 5,103,085 A, and WO 2009/067103 A1 describes a tablet position detection in connection with a liquid manipulation device. An optical pointing device is further disclosed in WO 2005/088435 A1.
Conventional gesture sensors use three or more illumination sources, such as light emitting diodes (LEDs), and a light sensor, such as a photodetector. The illumination sources are lit in turn so that the sensor obtains spatial information from the reflection of the flashing light. Fig. 1 shows a simplified block diagram of a conventional gesture sensor. A photosensor 4 is positioned near illumination sources LED 1, LED 2 and LED 3. A control circuit 5 is programmed to turn the LEDs 1-3 on and off sequentially and to analyze the measurements detected by the photosensor 4. Data captured by the photosensor 4 is stored separately for each LED. For example, the detected data corresponding to each light-up of LED 1 is stored in an LED 1 register, the detected data corresponding to each light-up of LED 2 is stored in an LED 2 register, and the detected data corresponding to each light-up of LED 3 is stored in an LED 3 register. The result is a time-domain signal for each LED.
Fig. 2 shows an exemplary method for detecting a moving target using the sensor of Fig. 1. The movement is detected by observing the relative delay between sensed LED signals on the same axis. For example, to detect movement from left to right or right to left, the signals detected for LED 1 and LED 2 are compared, as shown in Fig. 2. LED 1 lights up at a different time than LED 2. LEDs 1 and 2 are positioned at known locations and are turned on and off in a known sequence. As the light from the LEDs hits a target moving across the LEDs, light is reflected from the moving target back to the photosensor 4. The detected reflected light is converted into a voltage signal, which is transmitted to the control circuit 5 (Fig. 1). The control circuit 5 includes an algorithm that uses the LED positions, the LED power-on sequence, and the received captured data to determine the relative movement of the target. The time interval between the lighting of consecutive LEDs is small compared to the time the moving target takes to cross the sensor and is therefore negligible when comparing the time-domain signal of one LED with that of another LED.
[0007] FIG. 2 shows the time-domain detected voltage signals for both left-to-right and right-to-left motion. The curves labeled "Signal from LED 1" show the sensed voltage resulting from the repeated illumination of LED 1. The low portion of each curve indicates that the target is not over or near LED 1; in other words, the target is not in the "field of view" or coverage area of the photosensor 4 within which light emitted by LED 1 can be reflected by the target onto the photosensor 4. If the target is not in the field of view of the photosensor 4 with respect to LED 1, the photosensor 4 detects no reflections of light emitted from LED 1. The high portion of the curve indicates that the target is in the field of view with respect to LED 1, meaning that the target is moving above or near LED 1. The curve labeled "Signal from LED 2" shows the detected voltage resulting from the repeated illumination of LED 2. LED 1 and LED 2 light up alternately, so while LED 1 is on, LED 2 is off, and vice versa. While the target is positioned in the field of view corresponding to LED 1 but not in the field of view corresponding to LED 2, the detected voltage associated with LED 1 illumination is high, but the detected voltage associated with LED 2 illumination is low. In simple terms, this corresponds to a target positioned above or near LED 1. When the target is positioned in the middle, between the two LEDs 1 and 2, the photosensor 4 detects reflected light from the illumination of both LED 1 and LED 2, resulting in high detected voltage levels corresponding to both LED 1 and LED 2. When the target is positioned above or near LED 2, the detected voltage associated with the lighting of LED 2 is high, but the detected voltage associated with the lighting of LED 1 is low.
When the target is positioned neither above LED 1 nor above LED 2 or between LED 1 and LED 2, the photosensor 4 does not detect reflected light in association with any of them, and the corresponding detected voltage levels are small.
For left-to-right movement, the detected voltage level for the "signal from LED 1" goes high before the detected voltage level for the "signal from LED 2", as shown in the left-to-right movement signals of Fig. 2. In other words, the time course of the voltage of the "signal from LED 2" is delayed relative to the time course of the voltage of the "signal from LED 1" when the target is moving from left to right.
Fig. 2 also shows the detected voltage signals in the case of a movement from right to left. In this case, the detected voltage level for the "signal from LED 2" goes high before the detected voltage level for the "signal from LED 1", as shown in the right-to-left motion signals on the right side of Fig. 2. Here, therefore, the time course of the voltage of the "signal from LED 1" is delayed relative to that of the "signal from LED 2" when the target moves from right to left.
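The delay-based comparison described above can be sketched in a few lines. The following is a minimal illustration, not taken from the patent: the function name, the threshold value, and the assumption that each signal is a list of voltage samples taken while the respective LED was lit are all hypothetical.

```python
def led_direction(sig_led1, sig_led2, threshold=0.5):
    """Infer horizontal motion from two time-domain LED signals.

    sig_led1, sig_led2: lists of detected voltages sampled while the
    respective LED was lit (hypothetical representation).
    threshold: voltage above which the target is taken to be in the
    field of view (illustrative value).
    """
    def first_high(sig):
        # Index of the first sample where the signal goes high,
        # i.e. where the target enters the field of view.
        for i, v in enumerate(sig):
            if v > threshold:
                return i
        return None

    t1, t2 = first_high(sig_led1), first_high(sig_led2)
    if t1 is None or t2 is None:
        return "no motion detected"
    # The LED 1 signal rising first corresponds to left-to-right motion,
    # matching the delay relationship described for Fig. 2.
    return "left-to-right" if t1 < t2 else "right-to-left"
```

For example, `led_direction([0, 0, 1, 1, 1, 0], [0, 0, 0, 1, 1, 1])` reports left-to-right motion, since the LED 1 signal goes high first.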
An up and down movement, where "up" and "down" are considered to be a movement along the y-axis, is similarly determined using the LEDs 2 and 3 and the corresponding voltage / time data. The control circuit 5 receives the detected voltages from the photosensor 4 and determines the relative movement of the target in the y-axis in a similar manner as previously described with respect to the x-axis.
A disadvantage of the multiple-illumination-source configuration, e.g. with LED 1 and LED 2, is the number of illumination sources that must be included in the device. As devices become ever smaller, additional components are undesirable.
The object of the invention is to provide a device for determining the movement of a target with which the target movement can be reliably determined, in particular with regard to the speed in addition to the direction of movement.
The device according to the invention of the kind set forth in the opening paragraph is characterized in that the light-modifying structure has a plurality of layers configured in a step-shaped structure and configured to selectively block part of the light.
Advantageous embodiments and further development are specified in the dependent claims.
In the present device, a single photosensor or a group of photosensors or light sensors is provided, wherein a light-modifying structure forwards reflected light to the photosensor. The photosensors detect the reflected light and output corresponding detected voltage signals. A control circuit receives and processes the detected voltage signals to determine movement of the target relative to the photosensor. The control circuit includes an algorithm configured to calculate one or more differential analog signals from the sensed voltage signals output by the segmented photosensors. From the calculated differential analog signals, a vector can then be determined which is used to determine the direction and/or the speed of movement of the target.
The device can be used for detecting hand gestures. The direction of travel of the target passing the sensor may be determined by applying vector analysis to differential signals based on the sensor signals.
Similarly, a proportional value of a motion velocity of the target moving past the sensor may also be determined by applying vector analysis to the differential signal. The light captured by the segment may, on the one hand, be light from a source of illumination reflected from the target. On the other hand, the light detected by the segment may also be ambient light. The differential signals may be composite differential signals, wherein a composite signal is a signal formed by adding two or more segment signals.
The calculation of one or more differential signals may include calculating a first differential signal indicative of the direction of travel of the target along an x-axis. The first differential signal preferably includes a positive maximum value and a negative maximum value. The direction of movement of the target may be determined to be in the positive x-direction when the positive maximum value temporally precedes the negative maximum value, and in the negative x-direction when the negative maximum value temporally precedes the positive maximum value. The calculation of one or more differential signals may include calculating a second differential signal indicative of the direction of travel of the target along a y-axis. This second differential signal may also include a positive maximum value and a negative maximum value. The direction of movement of the target may be determined to be in the positive y-direction if the positive maximum value temporally precedes the negative maximum value, and in the negative y-direction if the negative maximum value temporally precedes the positive maximum value.
A proportional value of the speed of movement of the target along the x-axis can be calculated using the time difference between successive zero crossings of the first differential signal, and a proportional value of the speed of movement of the target along the y-axis using the time difference between successive zero crossings of the second differential signal. The proportional values of the speed of movement of the target along the x-axis and along the y-axis may be superimposed to form a target vector. Further, determining one of a predefined set of directions according to the target vector is possible. The predefined set of directions may include a positive x-direction, a negative x-direction, a positive y-direction, and a negative y-direction. The target vector may have a target vector angle, and determining one of the predefined set of directions may include comparing the target vector angle with a set of defined threshold angles. Also, determining the direction may include comparing the target vector with a set of predefined distribution patterns, each distribution pattern corresponding to one of the directions in the predefined set of directions. Here, comparing the target vector may include determining a confidence score for the comparison of the target vector with each distribution pattern, and selecting the direction from the predefined set according to the highest confidence score.
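The zero-crossing and threshold-angle steps above can be sketched as follows. This is a minimal illustration under assumed conventions, not the patent's implementation: the function names, the unit sample spacing, and the choice of ±45° and ±135° as threshold angles are assumptions.

```python
import math

def zero_crossings(sig, dt=1.0):
    """Return interpolated times at which the differential signal crosses zero."""
    times = []
    for i in range(1, len(sig)):
        if sig[i - 1] * sig[i] < 0:  # sign change between adjacent samples
            # Linear interpolation for the crossing instant.
            frac = sig[i - 1] / (sig[i - 1] - sig[i])
            times.append((i - 1 + frac) * dt)
    return times

def speed_proportional(sig, dt=1.0):
    """Value proportional to target speed: inverse of the time difference
    between successive zero crossings of the differential signal."""
    t = zero_crossings(sig, dt)
    if len(t) < 2:
        return 0.0
    return 1.0 / (t[1] - t[0])

def classify_direction(vx, vy):
    """Map the target vector (vx, vy) onto one of the four predefined
    directions using assumed threshold angles of +/-45 and +/-135 degrees."""
    angle = math.degrees(math.atan2(vy, vx))
    if -45 <= angle < 45:
        return "+x"
    if 45 <= angle < 135:
        return "+y"
    if angle >= 135 or angle < -135:
        return "-x"
    return "-y"
```

A vector built from the two proportional speeds, e.g. `classify_direction(speed_x, speed_y)`, then yields one of the four predefined directions.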
The present device may include a segmented sensor having multiple segments, each outputting a segment signal corresponding to the light detected by that segment; a memory configured to store the segment signals; and a processor connected to the memory. The processor includes program instructions configured to calculate one or more differential signals in accordance with the segment signals output from the plurality of segments, and to determine a direction of travel of a target moving past the segmented sensor by applying vector analysis to the one or more differential signals.
Hereinafter, various embodiments of devices for determining the movement of targets will be explained with reference to the drawings, in which:
Fig. 1 is a simplified block diagram of a conventional gesture sensor;
Fig. 2 schematically shows an example of detecting a moving target using the gesture sensor of Fig. 1;
Fig. 3 is another diagram of a gesture detection apparatus;
Figs. 4 and 5 show exemplary composite signals generated from signals output from a segmented photosensor in response to a target moving in different directions;
Fig. 6 is a cross-sectional view of a device according to the invention in a "sundial configuration";
Fig. 7 is a plan view of the cell of Fig. 6;
Fig. 8 shows the cell of Fig. 7 rotated by 90°;
Fig. 9 is a plan view of several cells configured to form four sensor segments;
Fig. 10 is a cross-sectional view of an alternative sundial configuration;
Fig. 11 is a cross-sectional view of another sundial configuration;
Fig. 12 is a cross-sectional view of a "pinhole configuration" of a device according to the invention;
Fig. 13 is a schematic plan view of the cell of Fig. 12;
Fig. 14 is a cross-sectional view of a "roof configuration";
Fig. 15 is a plan view of a "quad-corner configuration";
Fig. 16 is a schematic cross-sectional view of the quad-corner configuration of Fig. 15;
Fig. 17 illustrates an exemplary implementation of stepped walls used in the blind configuration;
Fig. 18 shows adjacent cells in such a blind configuration;
Fig. 19 is a plan view of a quad-cell sensor configuration;
Fig. 20 is an exemplary waveform corresponding to image movement from left to right across the segmented sensor of Fig. 3;
Figs. 21 to 27 show further exemplary waveforms corresponding to different image movements over the segmented sensor;
Fig. 28 shows four Gaussian distributions corresponding to detected directions of travel "left", "right", "up" and "down"; and
Fig. 29 shows an exemplary 4x4 array of photodiode segments.
The present device may include a single light source and a multi-segmented single photosensor or a group of photosensors. By adding a light modifying structure, such as an optical lens structure or a mechanical structure, light reflected from a nearby target such as a hand or a finger can be focused on different segments of the photosensor depending on the target position relative to the segmented photosensor. The various segments of the photosensor capture the reflected light, and the relative amplitudes of the segment signals indicate the movement of the target. A control circuit receives and processes the data sensed by the segmented photosensor to determine target movement relative to the segmented photosensor. The single-light-sensor configuration is comparatively compact and inexpensive. A user may issue a command to the device by gesturing, without activating a touchscreen control or having to operate mechanical buttons. This offers significant performance and cost savings.
Fig. 3 shows a schematic diagram of a gesture detection apparatus 10. This apparatus 10 includes a single illumination source, e.g. an LED 11, and a segmented photosensor 12. The segmented photosensor 12 may be configured to detect only a specific light wavelength or specific wavelengths, such as the wavelengths emitted by the illumination source or LED 11. Such a configuration can be implemented by the use of a filter. The segmented photosensor 12 may be either a single sensor functionally divided into multiple segments or a group of individual photosensors. Thus, a four-segment segmented photosensor is functionally equivalent to four individual photosensors arranged in a quad layout. As used herein, a reference to a "segment" refers either to a segment, i.e. a part, within a single sensor or to a single sensor in a group of sensors. Fig. 3 shows the segmented photosensor 12 both in an (upper) side view and in a (lower) plan view showing the various segments A, B, C and D.
Referring to Figure 3, the segmented photosensor 12 includes four segments A, B, C, and D. Although a four-segment detector is the simplest implementation, a larger number of segments may be provided to increase the resolution of the system. The signal processing electronics become increasingly complex as the number of segments increases. The segments are each isolated from one another. The LED 11 is positioned near the segmented photosensor 12. As a target moves near the LED 11 and into the corresponding field of view of the segmented photosensor 12, light emanating from the LED 11 is reflected by the moving target onto the segmented photosensor 12. The device 10 also includes, as the light modifying structure, an optical lens structure 13 for focusing light on the segmented photosensor 12. The focusing lens focuses light reflected from a moving target, such as a hand, in the space above the segmented photosensor 12. Only reflected light that is in the "field of view" is focused on the segmented photosensor 12. Although the optical lens structure 13 is schematically illustrated as a single lens element in FIG. 3, any number of lenses and/or optical elements directing light onto the segmented photosensor 12 may be provided. An exemplary implementation of an optical lens structure and/or light sensor is described in co-pending US Patent Application No. 61/490,568, filed May 26, 2011, entitled "Light Sensor Having Glass Substrates With Lens Formed Therein", and co-pending US Patent Application No. 61/491,805, filed May 31, 2011, entitled "Light Sensor Having Glass Substrate With Lens Formed Therein". Each segment (A, B, C and D) of the segmented photosensor 12 outputs a segment signal to a control circuit 14, where the segment signals are processed.
The LED 11 is continuously or periodically energized to illuminate the target. The light reflected from the target produces a segment signal on each segment of the segmented photosensor 12. These segment signals are processed and stored in a buffer, which may be integrated with the control circuit 14 or provided separately from it. The control circuit 14 analyzes the stored data and determines whether a valid gesture has been detected. The same data can also be used so that the segmented photosensor 12 operates as a proximity detector. The same photosensor structure may be used with a different signal processing circuit so that the device 10 can also serve, e.g., as an ambient light sensor.
When the LED 11 is turned on or lights up, the target is illuminated if it is in the near space above the segmented photosensor 12. The moving target is represented in Fig. 3 by way of example as a flat reflector. The reflection of the target is imaged by the optical lens structure 13 onto the segmented photosensor 12. The example of Fig. 3 shows a movement of the target from right to left. As the edge of the target moves through the center of the imaging zone, the focused image of the target edge moves over the segmented photosensor 12. Segments A and C are the first to respond to the moving image, followed by segments B and D, cf. Fig. 4. The control circuit 14 is programmed to detect this sequence of events and recognizes a target movement from right to left. Likewise, target motion from left to right is recognized by the opposite sequence, and both top-to-bottom and bottom-to-top target motions can be detected using the orthogonal pair of signals. A target movement toward or away from the sensor can be detected from the absolute amplitude of the sum of the four segments A-D, which also provides a proximity measurement.
Figures 4 and 5 show exemplary composite signals generated from signals output from the segmented photosensor 12 in response to a target moving in different directions. A composite signal is composed of two or more segment signals, each segment signal providing data about the sensed voltage over time. The composite signals and the composite signal analysis examples shown in Figs. 4 and 5 show an exemplary procedure for analyzing the segment signals to determine target motion. Of course, alternative analysis techniques may be used on the segment signals to determine relative target motion.
Referring to Fig. 4, to determine whether a target is moving from right to left or from left to right, the segment signals from segment A and segment C are added together to form a composite signal A + C, and the segment signals from segment B and segment D are added together to form a composite signal B + D. Fig. 4 shows exemplary composite signals corresponding to the determination of right-to-left or left-to-right movement of the target. The composite signal B + D is subtracted from the composite signal A + C to form a differential composite signal (A + C) - (B + D). When the movement is from right to left, the differential composite signal (A + C) - (B + D) has a positive peak followed by a negative peak, as shown in the lower left-hand graph of Fig. 4. When the movement is from left to right, the differential composite signal (A + C) - (B + D) has a negative peak followed by a positive peak, as shown in the lower right-hand graph of Fig. 4.
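The peak-ordering analysis of the differential composite signal can be illustrated with a short sketch. This is a minimal, hypothetical implementation: the function name and the representation of each segment signal as a list of voltage samples are assumptions, not the patent's circuitry.

```python
def x_direction(seg_a, seg_b, seg_c, seg_d):
    """Infer left/right motion from four segment signals.

    Each argument is a list of sampled voltages from one sensor segment
    (hypothetical representation). Forms the differential composite
    signal (A+C)-(B+D) and compares the positions of its positive and
    negative peaks.
    """
    diff = [(a + c) - (b + d)
            for a, b, c, d in zip(seg_a, seg_b, seg_c, seg_d)]
    if max(diff) <= 0 or min(diff) >= 0:
        return "no motion"            # no clear bipolar peak pair
    i_pos = diff.index(max(diff))     # sample index of the positive peak
    i_neg = diff.index(min(diff))     # sample index of the negative peak
    # Positive peak first corresponds to right-to-left movement
    # (lower left-hand graph of Fig. 4); the reverse order to left-to-right.
    return "right-to-left" if i_pos < i_neg else "left-to-right"
```

The same peak-ordering test applied to (A+B)-(C+D) would distinguish bottom-to-top from top-to-bottom movement.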
In Fig. 3 it can be seen that the direction of movement of the target is opposite to the direction of movement of the image on the segmented photosensor 12. This image inversion is a result of the optical lens structure 13. In alternative embodiments, described in detail below, the optical lens structure 13 is replaced by one of a number of mechanical structures. In some of these alternative configurations, the image on the segmented photosensor 12 moves in the same direction as the target; the composite signals (A + C) and (B + D) shown in FIG. 4 are then swapped, and the differential composite signal (A + C) - (B + D) is inverted. As the target moves from right to left, as shown in Fig. 3, the image on the segmented photosensor 12 moves from left to right. When the target is moving from right to left, as in Fig. 4, the image first appears on segments A and C because the target is on the right, but the image does not yet appear on segments B and D; the resulting composite signal A + C begins to increase, as shown in the upper left-hand graph of Fig. 4, whereas the composite signal B + D remains at 0. As the target then moves to the left, the image also begins to appear on segments B and D while still appearing on segments A and C, and the resulting composite signal B + D begins to increase, as shown in the middle left-hand curve of Fig. 4. Finally, the image appears completely on all segments A-D. When the trailing edge of the target image moves away from segments A and C, the composite signal A + C returns to 0, and the negative peak of the differential composite signal (A + C) - (B + D) is formed.
Similarly, as the target moves from left to right, the image initially appears on segments B and D while the target is on the left, but the image does not yet appear on segments A and C; the resulting composite signal B + D starts to increase, as shown in the upper right-hand graph of Fig. 4, while the composite signal A + C remains at 0. As the target moves to the right, the image also begins to appear on segments A and C, and the resulting composite signal A + C starts to increase, as shown in the middle right-hand graph of Fig. 4. Finally, the image appears completely on all segments A-D. As the trailing edge of the target image moves away from segments B and D, the composite signal B + D returns to 0, and the positive peak of the differential composite signal (A + C) - (B + D) is formed.
An up/down movement is determined similarly. To determine whether a target is moving from top to bottom or from bottom to top, the segment signals from segment A and segment B are added together to form a composite signal A + B, and the segment signals from segment C and segment D are added together to form a composite signal C + D. Fig. 5 shows exemplary composite signals corresponding to the determination of top-to-bottom or bottom-to-top movement of the target. The composite signal C + D is subtracted from the composite signal A + B to form a differential composite signal (A + B) - (C + D). In a bottom-to-top motion, the differential composite signal (A + B) - (C + D) has a positive peak followed by a negative peak, as shown in the lower left-hand graph of Fig. 5. In a top-to-bottom movement, the differential composite signal (A + B) - (C + D) has a negative peak followed by a positive peak, as shown in the lower right-hand graph of Fig. 5.
When the target moves from bottom to top, the image first appears on segments A and B, but not yet on segments C and D. The resulting composite signal A + B begins to increase, as shown in the upper left-hand curve of Fig. 5, but the composite signal C + D remains at 0. As the target continues to move, the image also begins to appear on segments C and D while still appearing on segments A and B, and the resulting composite signal C + D begins to increase, as shown in the middle left-hand curve of Fig. 5. Finally, the image appears completely on all segments A-D. As with the movement from right to left, in the bottom-to-top motion the differential composite signal (A + B) - (C + D) has a positive peak followed by a negative peak, as shown in the lower left-hand graph of Fig. 5. As can easily be seen, the opposite movement, from top to bottom, produces a similar differential composite signal (A + B) - (C + D), but with the opposite phase, as shown in the lower right-hand curve of Fig. 5.
Additional processing is performed to determine movement toward and away from the segmented photosensor 12, referred to as inside-out movement. To determine this movement, all four segment signals A, B, C, D are added to form a composite signal A + B + C + D. When the composite signal A + B + C + D increases over a certain period of time, it is determined that the target is moving toward the segmented photosensor 12, i.e. inward. When the composite signal A + B + C + D decreases over a certain period of time, it is determined that the target is moving away from the segmented photosensor 12, i.e. outward.
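A minimal sketch of this sum-based in/out detection follows; the function name and the `min_change` threshold are illustrative assumptions, not values from the patent.

```python
def z_direction(seg_a, seg_b, seg_c, seg_d, min_change=0.1):
    """Detect motion toward or away from the sensor from the trend of
    the composite signal A+B+C+D over the sampled window.

    Each argument is a list of sampled voltages from one segment
    (hypothetical representation); min_change is an assumed threshold
    that suppresses noise.
    """
    total = [a + b + c + d
             for a, b, c, d in zip(seg_a, seg_b, seg_c, seg_d)]
    change = total[-1] - total[0]
    if change > min_change:
        return "toward sensor"      # reflected signal grows as target approaches
    if change < -min_change:
        return "away from sensor"   # reflected signal shrinks as target recedes
    return "no radial motion"
```

The instantaneous value of the same sum also serves as the proximity measurement mentioned earlier.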
In general, the segments A, B, C, D are measured and the segment signals suitably processed to determine changes in the size of the composite signals. These changes, when compared in time to changes in the size of other composite signals, determine the relative movement of a target that reflects light back to the segmented photosensor 12.
Mechanical structures can be used instead of the optical lens structure 13 to influence how the reflected light is directed to the segmented photosensor 12. A first mechanical structure is referred to as a "sundial configuration". Such a sundial configuration implements a physical "wall" protruding from the sensor surface of the segmented photosensor 12. The wall effectively casts a "shadow" on various sensor segments as the target moves through the space above the segmented photosensor 12. This shadow is tracked, and the target movement is determined accordingly.
Fig. 6 shows an example of a cross-sectional view of such a sundial configuration. The sundial configuration provides a mechanical means for directing reflected light to a photosensor, in this case a photodiode. The central structure is the physical sundial wall, which is used to block reflected light. Referring to Fig. 6, two N-EPI to P-SUBSTRATE junctions form two photodiodes, one on each side of the wall. The wall is a series of metal layers built up to separate the two photodiodes. In the exemplary configuration of Fig. 6, the wall includes a first metal layer M1, a second metal layer M2, a third metal layer M3, and an upper metal layer TM. The metal layers are separated by passivation layers, such as silicon dioxide, in which contact holes (vias) are formed. The metal layers, passivation layers and contact holes are formed by conventional semiconductor processing techniques. The wall is formed on a substrate doped to form the photodiodes, also referred to as a cell. The first photodiode, or photodiode cell A, is formed by an N-EPI/P-SUBSTRATE junction. A metal contact M1 is coupled to the N-EPI region to make contact with the cathode of the photodiode cell A. The P-SUBSTRATE serves as the photodiode anode and is common to both the photodiode cell A and the photodiode cell B. An additional photodiode is formed by adding a P-well layer to the top of the N-EPI layer of photodiode cell A. A contact for the P-well is formed at the end of the P-well (not shown in Fig. 6). The P-well photodiode can be used to measure ambient light when the gesture function is not used. Such configuration and functionality is described in US Patent Application No. 12/889,335, filed September 23, 2010, entitled "Double Layer Photodiodes in Ambient Light Sensors and Proximity Detectors". The second photodiode, or photodiode cell B, is formed in an identical manner to the photodiode cell A.
The two photodiode cells A and B are separated by two P+ diffusions extending through the N-EPI region and in contact with the P-SUBSTRATE. An island of N-EPI is formed between the two P+ isolation diffusions. This island forms an additional diode which collects any stray photocurrent that could migrate from beneath the photodiode cell A and otherwise be collected by the photodiode cell B; it likewise collects any stray photocurrent that could migrate from beneath the photodiode cell B and otherwise be collected by the photodiode cell A. Together, the two P+ isolation diffusions and the N-EPI island form the A/B isolation region. The three elements of the A/B isolation region are all shorted together by the first metal layer M1, which is connected to ground at the top metal layer TM. Any photocurrent collected in the composite A/B isolation region is dissipated to ground, thereby preventing interference between the photodiode cell A and the photodiode cell B.
The structure in Fig. 6 is a cell containing the photodiode cell A, the photodiode cell B, the isolation region and the wall. Fig. 7 shows a top-down view of the cell of Fig. 6. This cell is configured to determine a left-to-right movement, as the wall is oriented perpendicular to the left-to-right direction of movement to be determined. To determine an upward / downward movement, the cell is rotated by 90 °, as shown in Fig. 8. In the cell configuration of Fig. 8, the wall structure is oriented perpendicular to the upward / downward movement to be determined. One reason for creating cells is that the size of the photodiode cells is limited, especially the width of the photodiode cell extending away from the wall structure. This limits the surface that can be used to measure the reflected light. Fig. 9 exemplifies a top-down view of a plurality of cells configured to form four blocks. Each cell is isolated from an adjacent cell by an isolation region I. In Fig. 9, block 1 consists of a group of alternating photodiode cells A and B. Block 1 is identical to block 4, which also contains a group of alternating photodiode cells A and B. All photodiode cells A in the two blocks 1 and 4 are shorted to form a pooled A node. Pooling the group of cells increases signal strength. Likewise, all the photodiode cells B in both blocks 1 and 4 are combined to form a single B-node. The same connection scheme is used to form a C-node and a D-node from the group of alternating photodiode cells C and D in blocks 2 and 3. The photodiode cells in blocks 2 and 3 are rotated 90 degrees relative to the photodiode cells in blocks 1 and 4. In this way there are four separate signals, one from each of nodes A, B, C and D.
The movement of the target in the left-right and up-down directions is again determined by analyzing the differential signals. For determining the target movement in the left-right direction, the differential signal A-B is formed and analyzed in the same way as the differential composite signal (A + C) - (B + D) of the quad-cell configuration described above. For determining the target movement in the up/down direction, the differential signal C-D is formed and analyzed in the same way as the differential composite signal (A + B) - (C + D) of the quad-cell configuration.
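The node pooling and differencing described above can be sketched in a few lines. This is an illustrative sketch, not part of the patent; the per-cell photocurrent values are invented for the example.

```python
# Sketch of the node pooling described above: all A cells across blocks 1
# and 4 are shorted into one A node, likewise for B, C and D, and the
# differential signals A-B and C-D indicate left-right and up-down motion.
def pool(cells):
    """Sum the photocurrents of all shorted cells of one node."""
    return sum(cells)

# Hypothetical per-cell photocurrents (arbitrary units).
a_cells = [0.20, 0.22, 0.21]  # A cells in blocks 1 and 4
b_cells = [0.10, 0.12, 0.11]  # B cells in blocks 1 and 4
c_cells = [0.15, 0.14, 0.16]  # C cells in blocks 2 and 3
d_cells = [0.15, 0.16, 0.14]  # D cells in blocks 2 and 3

A, B, C, D = (pool(x) for x in (a_cells, b_cells, c_cells, d_cells))
left_right = A - B  # positive: image predominantly on the A cells
up_down = C - D     # near zero: image balanced between C and D cells
```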
The cell structure shown in Fig. 6 is an exemplary "sundial configuration"; alternative structures are possible. Fig. 10 is a cross-sectional view of a modified sundial configuration.
As shown in Fig. 10, the wall is formed differently and the underlying substrate is doped differently. In this case, the isolation region between the two photodiode cells A and B consists of a single P-type diffusion. The smaller isolation region of Fig. 10 allows for increased packing density compared to that of Fig. 6. Contacts to the P-well and the N-EPI regions are formed at the end of the group (not shown in Fig. 10). The P-type isolation region in the substrate is connected to ground at the upper metal layer TM.
Fig. 11 shows a cross-sectional view of an alternative sundial configuration. In the configuration of Fig. 11, the wall is formed differently and the underlying substrate is doped differently. The photodiode cells do not contain a P-well in this configuration. The contacts to the N-EPI region are formed at the end of the group (not shown in Fig. 11). The P+ isolation region between the photodiode cells A and B is connected to ground at the top metal layer TM. In this embodiment, the absence of the P-well layer allows narrower photodiode cells A and B compared to those of Fig. 6, so this structure allows for a higher packing density than that of Fig. 6.
Another mechanical structure is referred to as a "pinstrip configuration." Fig. 12 shows a cross-sectional view of such a pinstrip configuration. The pinstrip configuration provides a mechanical means for directing reflected light to a photosensor, in this case a photodiode, analogous to a pinhole camera in which the hole has been extended into a stripe or slot. The two N-EPI sections in the substrate form the cathodes of photodiode cells A and B, with the P-SUBSTRATE forming the common anode. The metal layer M3 is formed over an interlayer dielectric, such as silicon dioxide, which is optically transparent. The metal layer M3 and the open slot are formed by conventional semiconductor fabrication processes, such as conventional digital CMOS semiconductor manufacturing processes. Fig. 13 shows a top-down view of the cell of Fig. 12. As shown in Fig. 13, the open slot is aligned along a length of the cell. The open slot may run the entire length or a partial length of the cell.
In operation, reflected light passes through the open slot and is incident on the photodiodes, i.e. the N-EPI regions. When the target is positioned to the right of the open slot, light reflected from the target passes through the open slot and falls on the photodiode cell A on the left side. As the target moves from right to left, more reflected light falls on the left-side photodiode cell A until the target passes a critical angle, beyond which less reflected light falls on the photodiode cell A and reflected light instead begins to fall on the photodiode cell B on the right side. When the target is directly over the slot, at a crossover point, the signals received by the photodiode cells A and B are equal. This is the position of the highest total signal strength and is also the point where the difference between the two signals, A-B, is zero. As the target moves further to the left, more reflected light falls on the right-side photodiode cell B, and the difference signal A-B changes sign and becomes negative. After further movement of the target to the left, no reflected light is incident on the photodiode cell A on the left side. Similar to the sundial configuration, several cells of the pinstrip configuration are positioned side by side to form a block, and the signals from the individual photodiode cells A are combined to form the common A-node. The same kind of signal aggregation is used for the signals B to D. The orientation of the open slot determines the direction of the target movement to be determined. For example, the horizontal orientation of the open slot in Fig. 13 is used to determine an up/down motion. Multiple cells, aligned like the cell in Fig. 13, form a segment configured to measure the up/down motion. A vertical orientation of the open slot is used to determine left-right motion.
In an exemplary configuration, segments with the pinstrip configuration are arranged in a manner similar to the segments with the sundial configuration illustrated in Fig. 9, where segments A and D are configured to determine left-right motion and segments B and C are configured to determine up/down motion. The target movement in the left-right and up-down directions is determined using the difference signals in the same manner as in the sundial configuration described above.
In alternative configurations, the metal layer M3 and the open slot may be replaced by any type of light-blocking element that lets light enter through a defined area and blocks light at other locations, such as a MEMS (micro-electro-mechanical systems) device or other cantilevered or partially suspended element, wherein the covering element is supported by an optically transparent material or suspended over air near the open slot. A MEMS device is a very small mechanical device that is electrically actuated.
One possibility is the use of the pinstrip concept in the quad-cell design to make a micro quad cell. Fig. 19 shows a top-down view of an exemplary micro quad cell configuration. The micro quad cell consists of a group of small quad cells. All individual A segments are aggregated to form a single A signal, and the same applies to the B, C and D segments. The group of quad cells is covered by a metal layer having square or round holes ("apertures") that transmit light. The metal layer is formed using a semiconductor process in a manner similar to that described for the pinstrip concept. The dimensions of the quad cells A to D, the metal layer spacing, and the dimensions of the apertures in the metal layer are consistent with the dimensions commonly available in semiconductor processes. The apertures in the metal layer are positioned so that when light arrives from directly above an aperture, all cells are illuminated equally. When the angle of the light changes, the relative illumination of the four cells becomes unbalanced, and the four signals A through D are processed in the same manner as described previously.
Another mechanical structure is referred to as a "roof configuration." The roof configuration works much like the pinstrip configuration, except that reflected light does not enter the photodiodes of a cell through an open slot in the center of the cell structure, as in the pinstrip configuration; instead, the center of the cell structure is covered by a "roof" and the peripheral sides of the structure are open so that reflected light reaches the photodiodes of the cell. Fig. 14 shows a schematic cross-sectional view of an exemplary roof configuration. The roof configuration provides a mechanical means for directing reflected light to a photosensor, in this case a photodiode. The two N-EPI sections form the photodiode cells A and B. An upper metal layer TM forms a roof over the center of the cell structure, covering an inner portion of the photodiodes but not an outer portion. The upper metal layer TM is the top layer of a wall formed as a series of metal layers M1, M2, M3 separating the two photodiode cells A and B. The wall structure is formed in a manner similar to the wall structures of the sundial configurations, except that the top metal layer TM of the roof configuration extends over portions of the two photodiode cells A and B. The portion of the upper metal layer TM that extends across the two photodiode cells A and B is formed over an interlayer dielectric (not shown), such as silicon dioxide, that is optically transparent. Similar to the pinstrip and sundial configurations, multiple cells of the roof configuration are positioned side by side to form a segment, and multiple segments are configured and oriented to determine left-right and up-down movement. Reflected light is detected by the photodiode cells A and B, and the detected signals are collected and processed similarly to the pinstrip and sundial configurations described above.
Another mechanical structure is referred to as a "quad-corner configuration." The quad-corner configuration is conceptually similar to the sundial configuration in that it uses a physical wall located between light-sensing elements; but instead of implementing the wall at the silicon level and providing multiple cells for each segment, as in the sundial configuration, the quad-corner configuration is implemented at the chip-package level, where a wall is formed between the segments. Fig. 15 shows a top view of an exemplary quad-corner configuration, and Fig. 16 shows a cross-sectional view of the quad-corner configuration of Fig. 15. In the exemplary configuration illustrated in Figs. 15 and 16, the photosensor segments A-D are formed as four photodiodes on an integrated circuit chip. The four photodiodes may be considered identical to the four photodiodes of Fig. 3, except that instead of using the closely spaced quad geometry of Fig. 3, the photodiodes are spaced apart and arranged in the four corners of the substrate, as shown in Fig. 16. The integrated circuit chip is packaged in a chip package that includes a wall formed of an optically opaque material that blocks light, such as light reflected from a moving target. The section of the chip package over the photodiodes consists of an optically transparent material. The wall in the quad-corner configuration is high enough that each segment is a single sensor element, as opposed to multiple cells, as in the sundial and roof configurations. The movement of the target is determined in a manner similar to the sundial configuration, without having to combine individual cell signals for a particular segment. The quad-corner configuration thus includes a wall on the scale of the chip package, whereas the sundial configuration includes a wall on the scale of a transistor.
Another mechanical structure is referred to as a "blind configuration." The blind configuration is similar to the sundial configuration, except that the wall structure in each cell is formed at a non-perpendicular angle to the photodiode cell(s), as opposed to the perpendicular walls of the sundial configuration. The angled walls are fabricated by forming metal layers and vias in a step configuration, as shown in Fig. 17. In addition, each cell in the blind configuration contains a single photodiode cell located on one side of the angled wall. As shown in Fig. 18, in the blind configuration each of the four segments faces in a different direction, 90° apart. For example, segment A is configured with walls angled to the left, segment B with walls angled upward, segment C with walls angled downward, and segment D with walls angled to the right. In other words, each segment has a different field of view. Using these orientations, target movement in the left-right and up-down directions is determined using differential signals in the same manner as in the sundial configuration described above. Of course, other orientations may be used.
In some embodiments, filters on top of the photosensors are used to filter out light having wavelengths different from those of the illumination source.
The exemplary embodiments described above concern a gesture detection device having four symmetrically configured segments or photosensors. It will be appreciated that the concepts described herein may be extended to more than four segments, configured symmetrically or asymmetrically, such as an NxN, NxM, circular, or otherwise shaped group of photosensor segments or sensors. As previously described, a "segment" refers either to a partial segment of a single sensor or to a separate sensor, such as a photodiode, in a group of sensors.
As described, the control circuit is configured to process the segment signals received from the segmented photosensor. In particular, the control circuit implements an algorithm designed to detect both the direction and the speed of a gesture in two dimensions, e.g. a combination of left, right, up and down, to obtain a "gesture vector." This can be extended to larger groups of photodiodes to allow the formation of vector fields, further increasing the accuracy of the algorithm. A vector may be used for command identification, subsequent processing, or other application-specific uses. The ability to track speed can increase the effective number of recognizable gestures, by a factor of two when using only "slow" and "fast", or by more, providing increased functionality. The raw vector data may be used to define predetermined gestures, or the raw vector data may be converted into a probability that the vector corresponds to one of the four cardinal directions or another defined set of basic directions.
The algorithm also includes gesture recognition along the Z-axis, e.g. toward or away from the segmented photosensor. The algorithm may also include finger tracking.
The algorithm will be explained in connection with the device 10. The LED 11 illuminates the target moving over the segmented sensor 12, causing light reflected from the target to fall on the segmented sensor 12. The light modification structure 13 conceptually represents any means that directs reflected light onto the segmented sensor 12, including, but not limited to, the optical means and mechanical means described above. The image formed on the segmented sensor moves in correspondence with the target motion. From the segment signals output by the four segments A, B, C, D, composite signals are derived. The motion is determined by adding and subtracting the segment signals in different combinations for the two axes x and y, where the x-axis corresponds to left-right movement and the y-axis corresponds to up-down movement. The movement in the left-right direction is determined according to X = (A + C) - (B + D), and the movement in the up-down direction is determined according to Y = (A + B) - (C + D). The Z-axis motion, toward or away from the segmented sensor, corresponds to the total amount of light falling on all segments and is determined according to Z = A + B + C + D.
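As a minimal sketch (not from the patent itself), the three composite signals can be written directly from the formulas above:

```python
def composite_signals(a, b, c, d):
    """Composite signals from the four segment signals, as given in the
    text: X tracks left-right motion, Y tracks up-down motion, and Z is
    the total light, which tracks motion toward or away from the sensor."""
    x = (a + c) - (b + d)
    y = (a + b) - (c + d)
    z = a + b + c + d
    return x, y, z
```

For instance, with the image predominantly on the A and C segments, X is positive while Y is zero, consistent with a purely horizontal image position.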
When an image moves from left to right across the segmented sensor, the composite signal X first increases from 0 to a certain positive value, then decreases below 0 to a certain negative value, before eventually returning to 0. If the movement is exclusively in the x direction, the composite signal Y does not change very much, and if it does, it changes only in one direction, since the segments are asymmetrically illuminated by the illumination source. The composite signal Z increases with the illumination, regardless of the direction of movement along the x-axis or y-axis.
The relationship between the direction of the target movement and the corresponding direction of image movement on the sensor depends on the light-steering mechanism used to direct reflected light onto the segmented sensor. Fig. 3 shows in the left half an exemplary target movement from right to left. As previously mentioned, this target motion is detected inversely as image motion on the segmented sensor 12. For a target movement from right to left, there is a corresponding image movement from left to right on the sensor 12, and vice versa. Similarly, for a target movement from top to bottom, there is a corresponding image movement from bottom to top, and vice versa. In the examples described above, there is thus an inverse relationship, with the direction of movement of the target being opposite to the direction of image movement. Alternative relationships are also contemplated.
Fig. 20 shows an exemplary waveform corresponding to image movement from left to right across the segmented sensor 12 of Fig. 3. An image movement from left to right corresponds to a target movement from right to left. As the target moves in from the far right, an image eventually begins to appear on the segments A and C. As the target continues to move from right to left, an increasing portion of the target is imaged on segments A and C, resulting in an increasing X value. At some point, a maximum image is detected on the segments A and C, corresponding to the point immediately before the image reaches the segments B and D. This point corresponds to a maximum X value, shown in Fig. 20 as the positive peak of the sine-like waveform. As the target moves further to the left, the image moves farther to the right and begins to fall on segments B and D. In the formula for calculating the value X, a positive value for B + D is subtracted from A + C, resulting in a decreasing value of X. Eventually the target moves to a point where half the image strikes segments A and C and the other half strikes segments B and D, which corresponds to the center zero crossing in Fig. 20. As the target moves further to the left, the image moves farther to the right, falling more and more on segments B and D and less and less on segments A and C, resulting in an increasingly negative value of X. Eventually, the value of X reaches a negative maximum, corresponding to the position of the target where the image no longer falls on the segments A and C and falls on the segments B and D to a maximum extent. As the target continues to move to the left, less and less of the image falls on the segments B and D until the target reaches a position where no reflected light falls on any of the segments, which corresponds to the rightmost zero crossing in Fig. 20.
Fig. 21 shows an exemplary waveform corresponding to up/down image movement across the segmented sensor 12 while the target moves from right to left, as in Fig. 20. The exemplary waveforms shown in Figs. 20 and 21 correspond to a target movement exclusively in the x direction. Ideally, for a target movement exclusively in the x direction, the Y value is 0. In practice, however, a value other than 0 is usually determined because the segmented sensor 12 is asymmetrically illuminated by the LED 11. The waveform shown in Fig. 21 shows a positive value other than 0, but is intended to represent a trivial non-zero value, which may be positive, negative, 0, or a combination over time.
Fig. 23 shows an exemplary waveform corresponding to up/down image movement across the segmented sensor 12 of Fig. 3. An image movement from top to bottom corresponds to a target movement from bottom to top. The waveform shown in Fig. 23 corresponds to the composite signal Y and is determined similarly to the waveform corresponding to the composite signal X shown in Fig. 20. The positive values of Y correspond to reflected light falling exclusively or predominantly on the segments A and B, and the negative values of Y correspond to reflected light falling exclusively or predominantly on the segments C and D. The zero crossings correspond either to no image falling on the segments A, B, C and D, or to equal portions of the image falling on the segments A + B and the segments C + D.
Fig. 22 shows an exemplary waveform corresponding to image movement from left to right across the segmented sensor 12 while the target moves from bottom to top, as in Fig. 23. The exemplary waveforms shown in Figs. 22 and 23 correspond to a target movement exclusively in the y direction. Ideally, for a target movement exclusively in the y direction, the X value is 0. In practice, however, a value other than 0 will usually result because the segmented sensor 12 is asymmetrically illuminated by the LED 11. The waveform shown in Fig. 22 shows a positive value other than 0, but is intended to represent a trivial non-zero value, which may be positive, negative, 0, or a combination over time.
In order to determine a gesture in the z direction, the algorithm looks for a sufficient increase in the Z, or VSUM, signal (A + B + C + D) without a vector being detected in the x or y direction.
Referring to Figs. 20 and 23, the positive and negative zero crossings coincide with the image moving from one side of the segmented sensor 12 to the other. Therefore, the faster the target moves, the faster the image crosses from one side of the segmented sensor 12 to the other, and the more closely spaced in time the zero crossings of the waveform are. This correlates directly with the speed. Figs. 24-27 show waveforms analogous to the respective waveforms of Figs. 20-23, except that the target motion corresponding to the waveforms in Figs. 24-27 is faster than the target motion corresponding to the waveforms in Figs. 20-23. The waveforms corresponding to faster target motion, such as those shown in Figs. 24-27, have a shorter period, i.e. are compressed, compared to the waveforms corresponding to similar but slower target motion, such as those shown in Figs. 20-23.
The reflected light is sampled at a predetermined rate, e.g. once every millisecond. At time 0, the X value starts to become positive, as shown in Fig. 20. At a later time, such as 30 milliseconds, the X value crosses zero and becomes negative. Dividing the sampling rate by the time between zero crossings gives a value proportional to the velocity. This is a rough estimate of the target velocity, since there are other contributing factors such as the distance of the target from the sensor, but the estimate provides an accurate relative velocity between directions, for example the velocity in the x direction compared to the y direction, since the estimated velocity in both the x and y directions can be calculated using the respective zero crossings and then compared. One exemplary application is the use of the estimated speed at the gesture-command level, where different commands are selected based on different estimated speeds. For example, rotation of a displayed object may be commanded at a fast rate if the determined estimated speed is greater than a high threshold, at a medium rate if the determined estimated speed is between the high threshold and a low threshold, or at a slow rate if the determined estimated speed is less than the low threshold.
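A sketch of the zero-crossing timing idea, assuming evenly spaced samples of the composite signal X; the sample values and sampling rate are invented for illustration:

```python
def zero_crossings(samples):
    """Indices i where the signal crosses zero between samples i-1 and i."""
    idx = []
    for i in range(1, len(samples)):
        if samples[i - 1] == 0:
            continue  # the crossing was already recorded at the zero sample
        if samples[i] == 0 or (samples[i] > 0) != (samples[i - 1] > 0):
            idx.append(i)
    return idx

def relative_speed(samples, sample_rate_hz):
    """Value proportional to target speed: the sampling rate divided by the
    number of samples between the first two zero crossings. Returns None
    when fewer than two crossings are found."""
    zc = zero_crossings(samples)
    if len(zc) < 2:
        return None
    return sample_rate_hz / (zc[1] - zc[0])
```

A faster gesture compresses the waveform, so its zero crossings are closer together and the computed value is larger, which is all that is needed to compare speeds against command thresholds.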
The above examples relate to waveforms resulting from gestures or target motion exclusively in the x or y direction. However, many gestures contain components in both directions, such as a diagonal target motion, and the corresponding waveform amplitudes may vary from case to case. Therefore, it is reasonable to look for a relative change between positive and negative, in particular zero crossings, and to do so for both the left-right and up-down channels simultaneously. If the target motion is not exclusively left-right or up-down, the resulting waveforms of the X and Y signals may vary in both amplitude and period.
By using the information contained in the composite signal X and the composite signal Y, a two-dimensional vector can be determined. If it is specified that a zero crossing in one direction must be followed by a zero crossing in the opposite direction to identify a gesture on either the left-right or up-down channel, and the first zero crossing occurs at a time t1 and the second zero crossing at a time t2, the velocity along the x or y direction is proportional to 1/(t2 - t1). The direction is determined depending on whether the first zero crossing is negative or positive. When this is done for both the left-right and up-down channels, the velocity Vx in the x direction and the velocity Vy in the y direction can be superimposed into a two-dimensional vector of the form Vx·i + Vy·j using Cartesian coordinates. The Cartesian coordinates can readily be converted into polar coordinates, which include a vector angle. The result is that the target movement can be detected along any angle and at any velocity in the x,y plane, limited only by the sampling rate. The higher the sampling rate, the finer the resolution of the vector angle. If, e.g., the speed Vx is greater than the speed Vy, it can be determined that the target is moving more in the left-right direction than in the up-down direction.
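The superposition and polar conversion can be sketched as follows. The sign convention for the first zero crossing is an assumption for illustration, since the text only states that the direction depends on whether the first crossing is positive or negative:

```python
import math

def channel_velocity(t1, t2, first_crossing_negative):
    """Per-channel speed proportional to 1/(t2 - t1); the sign of the
    first zero crossing is assumed to give the direction along the axis."""
    v = 1.0 / (t2 - t1)
    return -v if first_crossing_negative else v

def to_polar(vx, vy):
    """Superimpose Vx*i + Vy*j and convert to polar coordinates
    (magnitude, angle in degrees measured from the positive x axis)."""
    magnitude = math.hypot(vx, vy)
    angle_deg = math.degrees(math.atan2(vy, vx))
    return magnitude, angle_deg
```

A purely diagonal gesture with equal Vx and Vy thus yields a 45° vector angle, and the angular resolution improves as the sampling rate (and hence the precision of t1 and t2) increases.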
Various angle thresholds can be defined and the vector angle compared with the angle thresholds. For example, a vector angle between +45° and +135° is determined to be an upward movement of the target, and a vector angle between +45° and -45° is determined to be a movement of the target to the right. The algorithm may also be weighted asymmetrically. For example, a vector angle of 60° may still be determined to be a movement of the target to the right, although the vector angle is closer to +90°, which corresponds to an upward movement of the target. This example illustrates the general concept that the algorithm can be programmed to take into account, e.g., prior gesture distributions, which may be uniform or non-uniform.
This concept can be extended by using vectors with a set of probability functions to represent the confidence that a target movement occurred in a particular defined direction. In this way, the user need not make an exact gesture for it to be recognized as one of the defined directions of target movement, such as left, right, up, and down. This can also compensate for some noise that might have been introduced. For example, if only the directions left-to-right, top-to-bottom, right-to-left, and bottom-to-top are to be detected, four probability functions, such as Gaussian distributions, can be defined, with the maxima centered on each desired vector and the half-maximum lying exactly half way (radially) between adjacent desired vectors. Fig. 28 shows four Gaussian distributions corresponding to the detected directions left, right, up and down. In this example, the maxima lie at 0° (right), +90° (up), -90° (down), and 180° (left), and the half maxima lie at ±45° and ±135°. Each direction is equally likely to occur. For a given vector, the vector angle is determined with respect to 0° (the positive x direction), and the probability of the vector under each of the four probability distributions is calculated. The largest of these values is the "most likely" and is determined to be the target movement. Two example vectors are shown in Fig. 28, each corresponding to a measured target movement. Vector 1 is determined to be a left-to-right movement with 90% confidence. Vector 2 is ambiguous between a top-to-bottom and a right-to-left movement, since this vector is equally probable under the left distribution and the down distribution. The algorithm can be programmed to return a predefined result in the case of such an ambiguity; the algorithm may, however, also be programmed not to respond to an ambiguous result, or to generate an error message or indication.
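A sketch of the four-direction Gaussian scheme described above. The spread (sigma) is chosen here so that the half maximum falls at ±45° from each peak, matching the text; the ambiguity margin is an invented illustration:

```python
import math

# Maxima at 0° (right), +90° (up), -90° (down), 180° (left), as in Fig. 28.
DIRECTIONS = {"right": 0.0, "up": 90.0, "down": -90.0, "left": 180.0}
# exp(-d^2 / (2*sigma^2)) = 0.5 at d = sigma*sqrt(2*ln 2); solving for a
# half maximum at d = 45° gives sigma ~= 38.22°.
SIGMA = 45.0 / math.sqrt(2.0 * math.log(2.0))

def angle_diff(a, b):
    """Smallest signed difference between two angles, in degrees."""
    return (a - b + 180.0) % 360.0 - 180.0

def direction_probabilities(vector_angle_deg):
    """Unnormalized Gaussian likelihood of each defined direction."""
    return {name: math.exp(-(angle_diff(vector_angle_deg, mu) ** 2)
                           / (2.0 * SIGMA ** 2))
            for name, mu in DIRECTIONS.items()}

def classify(vector_angle_deg, ambiguity_margin=0.05):
    """Most likely direction, or None when the two best are (nearly) tied,
    e.g. for a vector half way between two maxima."""
    ranked = sorted(direction_probabilities(vector_angle_deg).items(),
                    key=lambda kv: kv[1], reverse=True)
    if ranked[0][1] - ranked[1][1] < ambiguity_margin:
        return None
    return ranked[0][0]
```

A vector at 135° sits exactly between the "up" and "left" maxima, so both likelihoods are equal and the classifier reports the ambiguous case instead of guessing.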
As described above, the algorithm is applied to a four-segment sensor. The segmented sensor and the algorithm can be adapted to a sensor with more than four segments, e.g. an NxN or NxM group of segments. Fig. 29 shows an exemplary 4x4 group of photodiode segments. A vector can be determined for each of nine different four-segment arrangements. For example, a first four-segment array contains segments Nos. 1, 2, 5, and 6; a second four-segment array contains segments Nos. 6, 7, 10, and 11; a third four-segment array contains segments Nos. 11, 12, 15, and 16; and so on. By applying the algorithm to each of the nine four-segment arrays, a vector field can be assembled, which can be used to obtain more complex target motion information.
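The nine overlapping four-segment windows of a 4x4 group can be enumerated as below. The mapping of the window positions to the letters A-D (A top-left, B top-right, C bottom-left, D bottom-right) is an assumption for illustration:

```python
def window_vectors(grid):
    """grid: 4x4 list of segment signal values (row 0 at the top).
    Returns a 3x3 field of (X, Y) pairs, one per overlapping 2x2 window,
    using the quad-cell formulas X = (A+C)-(B+D), Y = (A+B)-(C+D) with the
    assumed mapping A=top-left, B=top-right, C=bottom-left, D=bottom-right."""
    field = []
    for r in range(3):
        row = []
        for c in range(3):
            a, b = grid[r][c], grid[r][c + 1]
            cc, d = grid[r + 1][c], grid[r + 1][c + 1]
            row.append(((a + cc) - (b + d), (a + b) - (cc + d)))
        field.append(row)
    return field
```

Uniform illumination yields a zero vector everywhere, while an image concentrated on the left column produces positive X only in the windows that straddle it, giving the spatially resolved field the text describes.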
The device 10 has been described in connection with a single illumination source, such as the LED 11. Alternatively, a plurality of light sources pulsed at the same time may be used as the illumination source, as opposed to a plurality of illumination sources pulsed serially, as in the conventional device of Fig. 1. By using a plurality of illumination sources pulsed simultaneously, a wider coverage area can be achieved. The coverage area of a particular illumination source is defined as the area above the illumination source within which light reflected from a target strikes the sensor. The coverage area is consistent with the field of view of the segmented sensor. Although light from the illumination source may strike the target outside the coverage area, the reflected light is only directed onto the segmented sensor when the target is within the coverage area. Outside the coverage area, reflected light is not directed correctly onto the segmented sensor. Multiple sources of illumination pulsed simultaneously increase the coverage area.
More than one illumination source may also be used in conjunction with the segmented sensor if the illumination sources are not pulsed simultaneously. In this way, multiple x-channels and multiple y-channels may be implemented, a first x-channel and a first y-channel corresponding to a first illumination source, etc.
The device 10 and the algorithm may also be adapted for use without an illumination source. Instead of capturing the image corresponding to reflected light originating from an illumination source, the ambient light is detected and, e.g., a decrease in ambient light caused by a passing target is determined. A passing target thus casts a shadow on the segmented sensor, and the shadow is detected as a decrease in ambient light. The shadow in an ambient light configuration is inversely analogous to an image in an illumination source configuration. In the ambient light configuration, the polarity of the three composite signals X, Y and Z is reversed.
The device 10 and the algorithm can also be used for finger tracking. By analyzing the instantaneous values of the composite signals X and Y, a current position of the target, such as a finger, can be determined. If, e.g., the value of the composite signal X is positive, exceeding a certain predetermined positive X threshold, and the value of the composite signal Y is equal to 0 or approximately equal to 0, i.e. does not exceed a certain Y threshold near 0, it is determined that the person's finger is positioned at the left side of the segmented sensor. Similarly, if the value of the composite signal X is equal to 0 or approximately equal to 0, i.e. does not exceed a certain X threshold near 0, and the value of the composite signal Y is negative, falling below a certain predefined negative Y threshold, it is determined that the person's finger is positioned below the sensor. If the value of the composite signal X is positive and the value of the composite signal Y is negative, it is determined that the person's finger is positioned near the lower left corner of the sensor. In this way, nine positions can be determined. Eight of the positions lie around the perimeter: the four corners plus "left," "right," "up," and "down." The ninth position is the center of the segmented sensor, corresponding to the case where the value of the composite signal X and the value of the composite signal Y are both equal to 0 but the Z, or VSUM, signal (A + B + C + D) is not equal to 0.
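A sketch of the nine-position decision logic. The threshold values and position names are invented for illustration; the code follows the text's convention that positive X indicates "left" and negative Y indicates "down":

```python
def finger_position(x, y, z, xy_thresh=0.1, z_thresh=0.1):
    """One of nine positions from the instantaneous composite signals,
    or None when Z indicates no finger is present. Thresholds are
    illustrative assumptions."""
    if z <= z_thresh:
        return None  # VSUM too small: no finger over the sensor
    # Horizontal component: X > 0 means left per the text's convention.
    if x > xy_thresh:
        horiz = "left"
    elif x < -xy_thresh:
        horiz = "right"
    else:
        horiz = ""
    # Vertical component: Y < 0 means down per the text's convention.
    if y > xy_thresh:
        vert = "up"
    elif y < -xy_thresh:
        vert = "down"
    else:
        vert = ""
    if not horiz and not vert:
        return "center"  # X and Y near 0 but Z nonzero
    return (vert + "-" + horiz).strip("-")  # corner or edge position
```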
Tracking successive finger positions also makes it possible to determine a vector. For example, three consecutive finger-position images showing the finger to the left of the sensor, at the center of the sensor, and to the right of the sensor indicate a target movement from right to left. In this sense, finger tracing that results in a vector determination is a more elaborate method of determining a target motion vector. Finger tracking can also be used for simpler applications in which a single finger position, rather than a succession of positions, indicates a command.
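The succession-of-positions idea can be sketched as follows. Because, per the text, a left → center → right sequence of images indicates a right-to-left target motion, the image displacement is negated to obtain the target vector; the coordinate mapping and names are illustrative assumptions.

```python
# Sketch: deriving a target motion vector from successive classified
# image positions. The image on the sensor moves opposite to the target
# (a left -> center -> right image sequence indicates right-to-left
# target motion), so the image displacement is negated.
IMAGE_COORDS = {"left": (-1, 0), "center": (0, 0), "right": (1, 0),
                "up": (0, 1), "down": (0, -1)}

def target_vector(positions):
    """Return the (dx, dy) target motion implied by a sequence of image
    positions, or (0, 0) if fewer than two samples are available."""
    if len(positions) < 2:
        return (0, 0)
    (x0, y0) = IMAGE_COORDS[positions[0]]
    (x1, y1) = IMAGE_COORDS[positions[-1]]
    # Negate: image displacement is opposite to target displacement.
    return (-(x1 - x0), -(y1 - y0))

print(target_vector(["left", "center", "right"]))  # (-2, 0): right-to-left
```

A single classified position, used directly as a command, covers the simpler applications mentioned above without invoking `target_vector` at all.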
The invention has been described above on the basis of specific embodiments, which contain details for a better understanding of the design and operating principles of the device 10. Many of the components described in the various embodiments may be interchanged. Accordingly, modifications are possible without departing from the scope of the invention.
Claims (15)
1. A device for determining the movement of a target, comprising at least one light sensor (12) and a light modification structure for relaying light reflected from the target to the at least one light sensor (12), characterized in that the light modification structure comprises a plurality of layers configured in a stepped structure and is configured to selectively block a portion of the light.
2. Device according to claim 1, arranged for determining a physical gesture.
3. Apparatus according to claim 1 or 2, characterized by a lighting source (11).
4. Apparatus according to claim 3, characterized in that the illumination source (11) comprises a light-emitting diode (11).
5. Device according to one of claims 1 to 4, characterized in that the at least one light sensor (12) comprises a photodiode.
6. Device according to one of claims 1 to 4, characterized by an array of individual light sensors (12).
7. Device according to one of claims 1 to 4, characterized by a single light sensor (12) which is divided into several segments.
8. Device according to one of claims 1 to 5 or 7, characterized in that the at least one light sensor (12) contains a plurality of cell structures, each containing two photodiodes, and the light modification structure contains a plurality of wall structures, one wall structure per cell, wherein the wall structure is positioned between the two photodiodes.
9. A device according to claim 8, characterized in that an upper layer of each wall structure has an outer periphery which does not overlap either of the two photodiodes.
10. A device according to claim 8, characterized in that an upper layer of each wall structure has an outer periphery which partially covers each of the two photodiodes.
11. A device according to any one of claims 1 to 10, characterized in that the light modification structure comprises a plurality of metal layers (M1, M2, M3) and a plurality of dielectric layers separating the metal layers, each dielectric layer containing a plurality of metal vias which are connected to the metal layers on each side of the dielectric layer.
12. The device according to claim 11, characterized in that the light modification structure is perpendicular to an upper surface of the at least one light sensor.
13. Device according to one of claims 1 to 5 or 7, characterized in that the at least one light sensor comprises a plurality of cell structures, each containing one or more photodiodes, and the light modification structure comprises a plurality of wall structures, one wall structure per cell, wherein the wall structure is arranged at a non-perpendicular angle to an upper surface of the one or more photodiodes.
14. Device according to one of claims 1 to 7, characterized in that a wall structure is arranged between two layers of the light modification structure.
15. The device according to claim 14, characterized in that the wall structure has a via contact hole.

For this, 14 sheets of drawings.
Similar technologies:
Publication number | Publication date | Patent title
AT518427B1|2017-10-15|A method of detecting gestures using a multi-segment photodiode and one or less illumination sources
DE102012008954A1|2012-11-08|A method of detecting gestures using a multi-segment photodiode and one or less illumination sources
DE112013000590B4|2016-05-04|Improved contrast for object detection and characterization by optical imaging
DE10241392B4|2015-02-19|Apparatus and method for detecting a three-dimensional relative movement
DE112012005324T5|2014-10-02|Optical measuring sensor
DE102008016215B4|2017-08-24|Information device operating unit
AT512461B1|2018-02-15|DEVICE FOR ENTERING INFORMATION TO A DATA PROCESSING PLANT
DE112013005337T5|2015-07-16|Object detection and tracking with variable field lighting devices
DE102006041815A1|2007-03-29|Position detection system using laser granulation
DE112015004407T5|2017-06-29|Background light detection for optical navigation systems
DE102008058759A1|2009-05-28|System and method for exact lift detection of an input device
US10445896B1|2019-10-15|Systems and methods for determining object range
DE102014108310A1|2014-12-24|Optical runtime system
CN102981608B|2018-02-16|Use the method for the light source detection gesture of more segment photodiodes and one or less
DE102018119376A1|2020-02-13|Display to show optical information
DE112018006394T5|2020-08-20|CAPACITIVE MOTION SENSOR
US20140035812A1|2014-02-06|Gesture sensing device
DE60205736T2|2006-06-01|SYSTEM AND METHOD FOR THE OPTICAL IDENTIFICATION OF OBJECTS
DE102017204073A1|2018-09-13|TOF CAMERA, MOTOR VEHICLE, METHOD FOR MANUFACTURING A TOF CAMERA, AND METHOD FOR DETERMINING A DISTANCE TO AN OBJECT
DE60205233T2|2006-06-29|LASER-OPTICAL SENSOR SYSTEM FOR OBJECT DETECTION
Bertozzi et al.2002|Vision-based pedestrian detection: will ants help?
US10656275B1|2020-05-19|Remote sensing for detection and ranging of objects
DE60009996T2|2004-09-02|Detection of the position and movement of sub-pixel images
DE60213777T2|2007-08-16|DEVICE AND METHOD FOR THE OPTICAL RECORDING OF THE MOVEMENT OF OBJECTS
DE60112320T2|2006-06-29|SYSTEM AND METHOD FOR THE OPTICAL IDENTIFICATION OF OBJECTS
Patent family:
Publication number | Publication date
AT518427A5|2017-10-15|
US8716649B2|2014-05-06|
AT518427B1|2017-10-15|
US20120280107A1|2012-11-08|
US20140284462A1|2014-09-25|
US10429236B2|2019-10-01|
CN102880286A|2013-01-16|
CN102880286B|2018-02-06|
Cited references:
Publication number | Filing date | Publication date | Applicant | Patent title
US5103085A|1990-09-05|1992-04-07|Zimmerman Thomas G|Photoelectric proximity detector and switch|
US20050179908A1|2004-02-12|2005-08-18|Sharp Kabushiki Kaisha|Optical movement information detector, movement information detection system, electronic equipment and encoder|
US20080006762A1|2005-09-30|2008-01-10|Fadell Anthony M|Integrated proximity sensor and light sensor|
US20090065821A1|2007-09-07|2009-03-12|Sun-Chan Lee|Image sensor and fabricating method thereof|
US20100078545A1|2008-09-26|2010-04-01|Avago Technologies Ecbu Ip Pte. Ltd.|Lensless user input device with optical interference|
DE19911419A1|1998-03-16|1999-10-14|Cyberoptics Corp|Area sensor for determining dimensions of object having varying profile and degree of reflection|
US6351576B1|1999-12-23|2002-02-26|Intel Corporation|Optical clocking distribution using diffractive metal mirrors and metal via waveguides|
US6624833B1|2000-04-17|2003-09-23|Lucent Technologies Inc.|Gesture-based input interface system with shadow detection|
US6618038B1|2000-06-02|2003-09-09|Hewlett-Packard Development Company, Lp.|Pointing device having rotational sensing mechanisms|
JP4040825B2|2000-06-12|2008-01-30|富士フイルム株式会社|Image capturing apparatus and distance measuring method|
US6870152B2|2002-02-01|2005-03-22|Georgia Tech Research Corporation|Segmented photodetectors for detection and compensation of modal dispersion in optical waveguides|
US6838715B1|2002-04-30|2005-01-04|Ess Technology, Inc.|CMOS image sensor arrangement with reduced pixel light shadowing|
US20070146318A1|2004-03-11|2007-06-28|Mobisol Inc.|Pointing device with an integrated optical structure|
EP1622200A1|2004-07-26|2006-02-01|CSEM Centre Suisse d'Electronique et de Microtechnique SA|Solid-state photodetector pixel and photodetecting method|
US7214920B2|2005-05-06|2007-05-08|Micron Technology, Inc.|Pixel with spatially varying metal route positions|
US7683407B2|2005-08-01|2010-03-23|Aptina Imaging Corporation|Structure and method for building a light tunnel for use with imaging devices|
US7620309B2|2006-04-04|2009-11-17|Adobe Systems, Incorporated|Plenoptic camera|
CN101558367A|2006-12-05|2009-10-14|索尼爱立信移动通讯有限公司|Method and system for detecting movement of an object|
US9778276B2|2007-11-20|2017-10-03|Hewlett-Packard Development Company, L.P.|Liquid handling device|
US8432372B2|2007-11-30|2013-04-30|Microsoft Corporation|User input using proximity sensing|
US8282485B1|2008-06-04|2012-10-09|Zhang Evan Y W|Constant and shadowless light source|
US8146020B2|2008-07-24|2012-03-27|Qualcomm Incorporated|Enhanced detection of circular engagement gesture|
TW201007531A|2008-08-01|2010-02-16|Yu-Hsiang Huang|Gesture detecting method of capacitive touch pad is disclosed|
TWI357370B|2008-08-20|2012-02-01|
US7960699B2|2008-10-22|2011-06-14|Eminent Electronic Technology Corp.|Light detection circuit for ambient light and proximity sensor|
US8344325B2|2009-05-22|2013-01-01|Motorola Mobility Llc|Electronic device with sensing assembly and method for detecting basic gestures|
US20100320552A1|2009-06-19|2010-12-23|Pixart Imaging Inc.|CMOS Image Sensor|
TW201106614A|2009-08-10|2011-02-16|Holtek Semiconductor Inc|Low-frequency amplifier and PIR detector|
US8072442B2|2010-02-09|2011-12-06|Sharp Kabushiki Kaisha|Electrically switchable field of view for embedded light sensor|
US20110310005A1|2010-06-17|2011-12-22|Qualcomm Incorporated|Methods and apparatus for contactless gesture recognition|
US8461513B2|2010-09-28|2013-06-11|Texas Advanced Optoelectronic Solutions, Inc.|Method and apparatus for device with minimized optical cross-talk|
US9229581B2|2011-05-05|2016-01-05|Maxim Integrated Products, Inc.|Method for detecting gestures using a multi-segment photodiode and one or fewer illumination sources|
US20140035812A1|2011-05-05|2014-02-06|Maxim Integrated Products, Inc.|Gesture sensing device|
US9746544B2|2008-12-03|2017-08-29|Analog Devices, Inc.|Position measurement systems using position sensitive detectors|
US9285459B2|2008-05-09|2016-03-15|Analog Devices, Inc.|Method of locating an object in 3D|
WO2010138385A1|2009-05-27|2010-12-02|Analog Devices, Inc.|Multiuse optical sensor|
US9229581B2|2011-05-05|2016-01-05|Maxim Integrated Products, Inc.|Method for detecting gestures using a multi-segment photodiode and one or fewer illumination sources|
US9702690B2|2011-12-19|2017-07-11|Analog Devices, Inc.|Lens-less optical position measuring sensor|
US9772404B2|2013-01-31|2017-09-26|Sharp Kabushiki Kaisha|Optical sensor and electronic device|
US9431440B2|2013-03-14|2016-08-30|Maxim Integrated Products, Inc.|Optical sensor|
US9582078B1|2013-06-28|2017-02-28|Maxim Integrated Products, Inc.|Integrated touchless joystick-type controller|
US9489051B2|2013-07-01|2016-11-08|Blackberry Limited|Display navigation using touch-less gestures|
US9323336B2|2013-07-01|2016-04-26|Blackberry Limited|Gesture detection using ambient light sensors|
US9256290B2|2013-07-01|2016-02-09|Blackberry Limited|Gesture detection using ambient light sensors|
US9398221B2|2013-07-01|2016-07-19|Blackberry Limited|Camera control using ambient light sensors|
US9342671B2|2013-07-01|2016-05-17|Blackberry Limited|Password by touch-less gesture|
US9367137B2|2013-07-01|2016-06-14|Blackberry Limited|Alarm operation by touch-less gesture|
US9423913B2|2013-07-01|2016-08-23|Blackberry Limited|Performance control of ambient light sensors|
US9405461B2|2013-07-09|2016-08-02|Blackberry Limited|Operating a device using touchless and touchscreen gestures|
US9304596B2|2013-07-24|2016-04-05|Blackberry Limited|Backlight for touchless gesture detection|
US9465448B2|2013-07-24|2016-10-11|Blackberry Limited|Backlight for touchless gesture detection|
US9194741B2|2013-09-06|2015-11-24|Blackberry Limited|Device having light intensity measurement in presence of shadows|
US9383434B2|2013-10-01|2016-07-05|Heptagon Micro Optics Pte. Ltd.|Compact opto-electronic sensor modules for detecting gestures or other movements by a user|
EP2860612B1|2013-10-04|2019-04-24|ams AG|Optical sensor arrangement and method for gesture detection|
KR102098400B1|2013-10-28|2020-04-08|매그나칩 반도체 유한회사|Gesture cell and gesture sensor having the same|
JP6223779B2|2013-10-28|2017-11-01|シャープ株式会社|Photodetector and electronic device|
CN104700603A|2015-03-11|2015-06-10|四川和芯微电子股份有限公司|System for remotely controlling intelligent mobile terminal|
GB2529567B|2015-09-22|2016-11-23|X-Fab Semiconductor Foundries Ag|Light shield for light sensitive elements|
Legal status:
2019-01-15 | MM01 | Lapse because of not paying annual fees | Effective date: 2018-05-31
Priority:
Application number | Filing date | Patent title
US201161483034P| true| 2011-05-05|2011-05-05|